In applied mathematics, Hessian automatic differentiation refers to techniques based on automatic differentiation (AD) that calculate the second derivative of an <math>n</math>-dimensional function, known as the Hessian matrix. When examining a function in a neighborhood of a point, one can discard many complicated global aspects of the function and accurately approximate it with simpler functions. The quadratic approximation is the best-fitting quadratic in the neighborhood of a point and is frequently used in engineering and science; around a point <math>x</math> it takes the form

<math>f(x + \Delta x) \approx f(x) + \nabla f(x)^\mathsf{T} \Delta x + \tfrac{1}{2}\, \Delta x^\mathsf{T} H(x)\, \Delta x.</math>

To calculate the quadratic approximation, one must first calculate the gradient and the Hessian matrix of the function. Let <math>f : \mathbb{R}^n \to \mathbb{R}</math>; for each <math>x \in \mathbb{R}^n</math>, the Hessian matrix <math>H(x) = \nabla^2 f(x)</math> is the second-order derivative of <math>f</math> at <math>x</math> and is a symmetric matrix. See the article on Hessian matrices for more on the definition.

== Reverse Hessian-vector products ==

For a given vector <math>v \in \mathbb{R}^n</math>, this method efficiently calculates the Hessian-vector product <math>H(x)v</math>. It can therefore be used to calculate the entire Hessian by computing <math>H(x)e_i</math> for each standard basis vector <math>e_i</math>, <math>i = 1, \ldots, n</math>. The method works by first using forward AD to compute the directional derivative <math>g(x) = \nabla f(x) \cdot v</math>, and then calculating the gradient of <math>g</math> using reverse AD to yield <math>\nabla g(x) = \nabla\left(\nabla f(x) \cdot v\right) = H(x)v</math>, using the symmetry of <math>H(x)</math>. Both of these steps come at a time cost proportional to a single evaluation of the function, so the entire Hessian can be evaluated at a cost proportional to <math>n</math> evaluations of <math>f</math>.
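As an illustrative sketch (not part of the original article), the reverse-over-forward composition described above can be written in JAX. The helper names <code>hvp</code> and <code>full_hessian</code> are hypothetical; only <code>jax.jvp</code>, <code>jax.grad</code>, and standard <code>jax.numpy</code> calls are assumed.

<syntaxhighlight lang="python">
import jax
import jax.numpy as jnp

def hvp(f, x, v):
    # Forward AD: g(y) = grad f(y) . v, the directional derivative of f along v.
    g = lambda y: jax.jvp(f, (y,), (v,))[1]
    # Reverse AD: grad g(x) = grad(grad f(x) . v) = H(x) v, by symmetry of H.
    return jax.grad(g)(x)

def full_hessian(f, x):
    # Recover the whole Hessian one column at a time: H(x) e_i for i = 1, ..., n.
    basis = jnp.eye(x.shape[0])
    return jnp.stack([hvp(f, x, basis[i]) for i in range(x.shape[0])], axis=1)

# Check on f(x) = sum_i x_i^3, whose Hessian is diag(6 x).
f = lambda y: jnp.sum(y ** 3)
x = jnp.array([1.0, 2.0, 3.0])
print(hvp(f, x, jnp.array([1.0, 0.0, 0.0])))  # [6. 0. 0.]
print(full_hessian(f, x))                     # diag(6, 12, 18)
</syntaxhighlight>

Each call to <code>hvp</code> costs a small constant multiple of one evaluation of <math>f</math>, so <code>full_hessian</code> exhibits the cost proportional to <math>n</math> evaluations stated above; the opposite composition, forward AD applied to <code>jax.grad(f)</code>, computes the same product.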